
    Low-energy Bluetooth for Detecting Real-world Penetrance of Bystander Naloxone Kits: A Pilot Study

    Opioid overdose is a growing public health emergency in the United States. The antidote naloxone must be administered rapidly after opioid overdose to prevent death. Bystander or take-home naloxone programs distribute naloxone to opioid users and other community members to increase naloxone availability at the time of overdose. However, data describing the natural history of take-home naloxone in the hands of at-risk individuals are lacking. To understand patterns of naloxone uptake in at-risk users, we developed a smart naloxone kit that uses low-energy Bluetooth (BLE) to unobtrusively detect the transit of naloxone through a hospital campus. In this paper, we describe the development of the smart naloxone kit and results from the first 10 participants in our pilot study.
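    The kit described above relies on fixed receivers picking up advertisements from a BLE beacon attached to each naloxone kit. The sketch below shows, in Python, one way such a receiver could log kit sightings; the `bleak` library and the "NLX-" device-name prefix are illustrative assumptions, not details from the paper.

```python
# A minimal sketch, assuming each kit carries a BLE beacon that advertises a
# device name with a hypothetical "NLX-" prefix; uses the open-source `bleak`
# library (pip install bleak). Not the authors' implementation.
import asyncio
from datetime import datetime, timezone

from bleak import BleakScanner

KIT_PREFIX = "NLX-"  # hypothetical advertised-name prefix for kit beacons


async def log_kit_sightings(scan_seconds: float = 10.0) -> list[dict]:
    """Scan for advertising BLE devices and record any that look like kits."""
    devices = await BleakScanner.discover(timeout=scan_seconds)
    now = datetime.now(timezone.utc).isoformat()
    return [
        {"kit_id": d.name, "address": d.address, "seen_at": now}
        for d in devices
        if d.name and d.name.startswith(KIT_PREFIX)
    ]


if __name__ == "__main__":
    print(asyncio.run(log_kit_sightings()))
```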

    Factors influencing terrestriality in primates of the Americas and Madagascar

    Among mammals, the order Primates is exceptional in having a high taxonomic richness in which the taxa are arboreal, semiterrestrial, or terrestrial. Although habitual terrestriality is pervasive among the apes and African and Asian monkeys (catarrhines), it is largely absent among monkeys of the Americas (platyrrhines), as well as galagos, lemurs, and lorises (strepsirrhines), which are mostly arboreal. Numerous ecological drivers and species-specific factors are suggested to set the conditions for an evolutionary shift from arboreality to terrestriality, and current environmental conditions may provide analogous scenarios to those transitional periods. Therefore, we investigated predominantly arboreal, diurnal primate genera from the Americas and Madagascar that lack fully terrestrial taxa, to determine whether ecological drivers (habitat canopy cover, predation risk, maximum temperature, precipitation, primate species richness, human population density, and distance to roads) or species-specific traits (body mass, group size, and degree of frugivory) associate with increased terrestriality. We collated 150,961 observation hours across 2,227 months from 47 species at 20 sites in Madagascar and 48 sites in the Americas. Multiple factors were associated with ground use in these otherwise arboreal species, including increased temperature, a decrease in canopy cover, a dietary shift away from frugivory, and larger group size. These factors mostly explain intraspecific differences in terrestriality. As humanity modifies habitats and causes climate change, our results suggest that species already inhabiting hot, sparsely canopied sites, and exhibiting more generalized diets, are more likely to shift toward greater ground use.
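    The abstract relates ground use to several ecological and species-specific predictors. The sketch below shows, on synthetic data, how such a multi-predictor analysis could be framed as a linear mixed model with species as a grouping factor; the column names and model form are simplified assumptions, not the authors' analysis (which would typically also account for phylogeny and site structure).

```python
# A minimal sketch on synthetic data, not the study's analysis: relate the
# proportion of observation time on the ground to the predictors named in the
# abstract, with species as a random grouping factor.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "species": rng.choice([f"sp{i}" for i in range(10)], size=n),
    "max_temperature": rng.normal(30, 4, n),   # hypothetical column names
    "canopy_cover": rng.uniform(0.3, 1.0, n),
    "frugivory": rng.uniform(0.0, 1.0, n),
    "group_size": rng.integers(2, 40, n),
})
# Synthetic response loosely mimicking the reported directions of effect.
df["ground_use"] = (
    0.01 * df["max_temperature"] - 0.2 * df["canopy_cover"]
    - 0.1 * df["frugivory"] + 0.002 * df["group_size"]
    + rng.normal(0, 0.05, n)
)

model = smf.mixedlm(
    "ground_use ~ max_temperature + canopy_cover + frugivory + group_size",
    data=df,
    groups=df["species"],
)
print(model.fit().summary())
```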

    Finishing the euchromatic sequence of the human genome

    The sequence of the human genome encodes the genetic instructions for human physiology, as well as rich information about human evolution. In 2001, the International Human Genome Sequencing Consortium reported a draft sequence of the euchromatic portion of the human genome. Since then, the international collaboration has worked to convert this draft into a genome sequence with high accuracy and nearly complete coverage. Here, we report the result of this finishing process. The current genome sequence (Build 35) contains 2.85 billion nucleotides interrupted by only 341 gaps. It covers ∼99% of the euchromatic genome and is accurate to an error rate of ∼1 event per 100,000 bases. Many of the remaining euchromatic gaps are associated with segmental duplications and will require focused work with new methods. The near-complete sequence, the first for a vertebrate, greatly improves the precision of biological analyses of the human genome including studies of gene number, birth and death. Notably, the human genome seems to encode only 20,000-25,000 protein-coding genes. The genome sequence reported here should serve as a firm foundation for biomedical research in the decades ahead.
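    For a rough sense of scale, the figures quoted above imply the following back-of-the-envelope quantities (a sketch of the arithmetic only, not numbers reported in the paper).

```python
# Back-of-the-envelope arithmetic from the quoted figures (illustrative only).
sequenced_bases = 2.85e9     # nucleotides in Build 35
error_rate = 1 / 100_000     # ~1 event per 100,000 bases
gaps = 341

print(f"Implied residual errors: ~{sequenced_bases * error_rate:,.0f}")
print(f"Mean contiguous stretch between gaps: ~{sequenced_bases / (gaps + 1):,.0f} bases")
```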

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a Bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients were 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% Bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570.
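    The reported posterior probabilities of harm follow directly from the posterior distribution of the adjusted odds ratio, since ORs below 1 correspond to worse outcomes here. The sketch below approximates the ACE-inhibitor posterior with a lognormal matched to the reported median and credible interval; this normal-on-the-log-scale approximation is an assumption for illustration, not the trial's Bayesian cumulative logistic model.

```python
# A minimal sketch, not the trial's analysis: approximate the posterior of the
# ACE-inhibitor adjusted OR with a lognormal matched to the reported summary
# (median 0.77, 95% CrI 0.58-1.06), then read off the probability of harm.
import numpy as np

rng = np.random.default_rng(0)

log_or_median = np.log(0.77)
log_or_sd = (np.log(1.06) - np.log(0.58)) / (2 * 1.96)  # CrI treated as +/-1.96 SD
posterior_or = np.exp(rng.normal(log_or_median, log_or_sd, size=100_000))

# ORs > 1 represent improvement, so OR < 1 corresponds to worse outcomes.
p_harm = np.mean(posterior_or < 1.0)
print(f"P(OR < 1 | data) ≈ {p_harm:.3f}")  # roughly matches the reported 94.9%
```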

    Leveraging digital tools to support recovery from substance use disorder during the COVID-19 pandemic response

    Treatment for substance use disorder (SUD) during the COVID-19 pandemic poses unique challenges, both from direct effects of the illness and from indirect effects of the physical measures needed to flatten the curve. Stress, isolation, lack of structure, limited access to physical and mental health care, and changes in treatment paradigms all increase risk of return to drug use events and pose barriers to recovery for people with SUDs. The pandemic has forced treatment providers and facilities to rapidly adapt to address these threats while redesigning their structure to accommodate physical distancing regulations. Digital health interventions can function without the need for physical proximity. Clinicians can use digital health interventions, such as telehealth, wearables, mobile applications, and other remote monitoring devices, to convert in-person care to remote-based care, and they can leverage these tools to address some of the pandemic-specific challenges to treatment. The current pandemic provides the opportunity to rapidly explore the advantages and limitations of these technologies in the care of individuals with SUD.

    OpiTrack: A Wearable-based Clinical Opioid Use Tracker with Temporal Convolutional Attention Networks

    Opioid use disorder is a medical condition with major social and economic consequences. While ubiquitous physiological sensing technologies have been widely adopted and extensively used to monitor day-to-day activities and deliver targeted interventions to improve human health, the use of these technologies to detect drug use in natural environments has been largely underexplored. The long-term goal of our work is to develop a mobile technology system that can identify high-risk opioid-related events (i.e., development of tolerance in the setting of prescription opioid use, return-to-use events in the setting of opioid use disorder) and deploy just-in-time interventions to mitigate the risk of overdose morbidity and mortality. In the current paper, we take an initial step by asking a crucial question: Can opioid use be detected using physiological signals obtained from a wrist-mounted sensor? Thirty-six individuals who were admitted to the hospital for an acute painful condition and received opioid analgesics as part of their clinical care were enrolled. Subjects wore a noninvasive wrist sensor during this time (1-14 days) that continuously measured physiological signals (heart rate, skin temperature, accelerometry, electrodermal activity, and interbeat interval). We collected a total of 2,070 hours (approximately 86 days) of physiological data and observed a total of 339 opioid administrations. Our results are encouraging and show that using a Channel-Temporal Attention TCN (CTA-TCN) model, we can detect an opioid administration in a time-window with an F1-score of 0.80, a specificity of 0.77, a sensitivity of 0.80, and an AUC of 0.77. We also predict the exact moment of administration in this time-window with a normalized mean absolute error of 8.6% and an R² coefficient of 0.85.
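    The sketch below illustrates the general idea of a dilated temporal-convolution block combined with channel and temporal attention, in the spirit of the CTA-TCN named above; the layer sizes, attention design, and five-channel input are illustrative assumptions, not the authors' architecture.

```python
# A minimal sketch of a causal dilated TCN block with simple channel and
# temporal attention; illustrative of the technique, not the paper's model.
import torch
import torch.nn as nn


class ChannelTemporalAttentionBlock(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, kernel: int = 3, dilation: int = 1):
        super().__init__()
        pad = (kernel - 1) * dilation  # causal padding, trimmed after the conv
        self.conv = nn.Conv1d(in_ch, out_ch, kernel, padding=pad, dilation=dilation)
        self.trim = pad
        # Channel attention: squeeze over time, gate each channel.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Conv1d(out_ch, out_ch, kernel_size=1),
            nn.Sigmoid(),
        )
        # Temporal attention: one score per time step.
        self.temporal_score = nn.Conv1d(out_ch, 1, kernel_size=1)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        h = self.conv(x)
        if self.trim:
            h = h[..., :-self.trim]       # keep the block causal
        h = self.act(h)
        h = h * self.channel_gate(h)      # reweight channels
        w = torch.softmax(self.temporal_score(h), dim=-1)
        return h * w                      # emphasize informative time steps


# Example: 5 wearable channels (HR, temp, accel, EDA, IBI), a 60-step window,
# and a single logit for "opioid administration occurred in this window".
model = nn.Sequential(
    ChannelTemporalAttentionBlock(5, 32, dilation=1),
    ChannelTemporalAttentionBlock(32, 32, dilation=2),
    nn.AdaptiveAvgPool1d(1),
    nn.Flatten(),
    nn.Linear(32, 1),
)
print(model(torch.randn(8, 5, 60)).shape)  # torch.Size([8, 1])
```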

    Understanding Naloxone Uptake from an Emergency Department Distribution Program Using a Low-Energy Bluetooth Real-time Location System

    INTRODUCTION: Emergency department (ED)-based naloxone distribution programs are a widespread harm reduction strategy. However, data describing the community penetrance of naloxone distributed from these programs are lacking. This study gauges acceptance of naloxone use and monitoring technology among people who use drugs (PWUD) and explores the use of real-time location systems (RTLS) in monitoring naloxone movements. METHODS: A prospective observational study was conducted on a convenience sample of individuals (N = 30) presenting to a tertiary-care academic medical center ED for an opioid-related complaint. A naloxone kit equipped with a low-energy Bluetooth (BLE) tracking system was employed to detect movement of naloxone off the hospital campus as a proxy for community penetrance, followed by a qualitative interview to gauge participant acceptance of naloxone use and monitoring technology. RESULTS: Detection of BLE signals verified transit of 24 distributed naloxone kits off our hospital campus. Three participants whose BLE signals were not captured reported taking their kits with them following discharge, suggesting technological errors occurred; another three participants were lost to follow-up. Qualitative interviews demonstrated that participants accepted ED-based naloxone distribution programs and passive tracking technologies, but revealed concerns regarding hypothetical continuous monitoring systems and problematic interactions with first responders and law enforcement personnel. CONCLUSIONS: Based on acquired BLE signals, 80% of dispensed naloxone kits left the hospital campus. Use of RTLS to passively geolocate naloxone rescue kits is feasible, but detection can be adversely affected by technological errors. PWUD are amenable to transient monitoring technologies but identified barriers to implementation.
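    The penetrance estimate above (24 of 30 kits, 80%) rests on whether each kit's beacon was detected leaving the campus. The sketch below shows, with hypothetical gateway IDs and toy log data, one way such BLE gateway logs could be reduced to a penetrance proportion; it is not the study's pipeline.

```python
# A minimal sketch with hypothetical column names and gateway IDs: a kit counts
# as having left campus if its last recorded sighting was at an exit gateway.
import pandas as pd

EXIT_GATEWAYS = {"ED_main_exit", "parking_garage"}  # hypothetical gateway IDs

logs = pd.DataFrame({
    "kit_id":  ["K01", "K01", "K02", "K03", "K03"],
    "gateway": ["triage", "ED_main_exit", "triage", "triage", "parking_garage"],
    "seen_at": pd.to_datetime([
        "2021-06-01 10:00", "2021-06-01 14:30",
        "2021-06-02 09:15",
        "2021-06-03 11:00", "2021-06-03 18:45",
    ]),
})

last_seen = logs.sort_values("seen_at").groupby("kit_id").tail(1)
left_campus = last_seen["gateway"].isin(EXIT_GATEWAYS)

n_dispensed = last_seen["kit_id"].nunique()
print(f"Estimated penetrance: {left_campus.sum()}/{n_dispensed} "
      f"= {left_campus.mean():.0%}")
```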